Lecture 21: Low-rank Approximation with Element-wise Sampling

Author

  • Michael Mahoney
Abstract

In particular, this means that smaller entries of A lead to random variables with smaller variance. On the other hand, the bound on ‖A − Â‖₂ depends on the maximum variance. Thus, to improve the results, one idea is to keep entries A_ij with probability p_ij ≤ p, so that all entries of Â have roughly the same variance. This helps us obtain sparser matrices while keeping similar quality-of-approximation bounds.
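As a concrete illustration of this element-wise sampling scheme, here is a minimal NumPy sketch. The particular choice of probabilities (proportional to A_ij², capped at 1, with a hypothetical parameter s controlling the expected number of kept entries) is an illustrative choice, not necessarily the lecture's exact prescription.

```python
import numpy as np

def elementwise_sample(A, P, rng=None):
    """Keep entry A_ij independently with probability P_ij and rescale the kept
    entries by 1/P_ij, so that E[A_hat] = A entry-wise (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(A.shape) < P           # Bernoulli(p_ij) mask
    A_hat = np.zeros_like(A, dtype=float)
    A_hat[keep] = A[keep] / P[keep]          # rescaling makes the estimate unbiased
    return A_hat

# Illustrative probabilities: proportional to A_ij^2 but capped at 1, so that
# no single entry's variance dominates the spectral-norm error bound.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100))
s = 5000                                      # expected number of kept entries (assumed knob)
P = np.minimum(1.0, s * A**2 / (A**2).sum())
A_hat = elementwise_sample(A, P, rng)
spectral_err = np.linalg.norm(A - A_hat, 2)   # the quantity ‖A − Â‖₂ bounded above
```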


Similar resources

Lecture 20: Low-rank Approximation with Element-wise Sampling

So far, we have been talking about sampling/projection of rows/columns—i.e., we have been working with the actual columns/rows or linear combinations of the columns/rows of an input matrix A. Formally, this means that we are pre- or post-multiplying the input matrix A with a sampling/projection/sketching operator (that itself can be represented as a matrix) to construct another matrix A′ (with di...
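To make that operator view concrete, here is a minimal sketch (an illustration, not taken from the lecture) in which a uniform row-sampling-and-rescaling operator S is represented explicitly as a matrix and applied by pre-multiplication; the sample size c is an assumed parameter.

```python
import numpy as np

# Minimal sketch: the sampling operator S is itself a (c x m) matrix, and
# pre-multiplying A by S produces the smaller sketch A' = S A whose rows are
# rescaled rows of A (uniform sampling with replacement, illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 50))
c = 100                                          # number of sampled rows (assumed)
idx = rng.choice(A.shape[0], size=c, replace=True)

S = np.zeros((c, A.shape[0]))
S[np.arange(c), idx] = np.sqrt(A.shape[0] / c)   # rescale by 1/sqrt(c * p_i), with p_i = 1/m
A_prime = S @ A                                  # equivalent to A[idx] * sqrt(m / c)
```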


Lecture 15: Additive-error Low-rank Matrix Approximation with Sampling and Projections

• A spectral norm bound for the reconstruction error of the basic low-rank approximation random sampling algorithm.
• A discussion of how similar bounds can be obtained with a variety of random projection algorithms.
• A discussion of possible ways to improve the basic additive error bounds.
• An iterative algorithm that leads to additive error with a much smaller additive scale. This will involve u...


Lecture 14: Additive-error Low-rank Matrix Approximation with Sampling and Projections

Today, we will shift gears and begin to discuss RandNLA algorithms for low-rank matrix approximation. We will start with additive-error low-rank matrix approximation with sampling and projections. These are of interest both historically and because they illustrate several techniques (norm-squared sampling, simple linear algebraic manipulations, the use of matrix perturbation theory, etc.), but they are...


Lecture 16: Relative-error Low-rank Matrix Approximation with Sampling and Projections

Today, we will start to discuss how to improve the rather coarse additive-error low-rank matrix approximation algorithms from the last two classes to obtain much better results for low-rank matrix approximation. Importantly, “better” means very different things to different research communities, and thus we will discuss several different notions of better. We will start by describing how to imp...


Tighter Low-rank Approximation via Sampling the Leveraged Element

In this work, we propose a new randomized algorithm for computing a low-rank approximation to a given matrix. Taking an approach different from existing literature, our method first involves a specific biased sampling, with an element being chosen based on the leverage scores of its row and column, and then involves weighted alternating minimization over the factored form of the intended low-ra...
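A rough sketch of the sampling idea described in the item above: bias the choice of element (i, j) using the row and column leverage scores of a rank-k SVD. The normalization and the parameter s below are assumptions made for illustration; the paper's exact sampling distribution and its weighted alternating-minimization step are not reproduced here.

```python
import numpy as np

def leverage_scores(A, k):
    """Row and column leverage scores of A w.r.t. its top-k singular subspaces."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    row_lev = np.sum(U[:, :k]**2, axis=1)    # l_i = ||U_k(i, :)||^2
    col_lev = np.sum(Vt[:k, :]**2, axis=0)   # l_j = ||V_k(j, :)||^2
    return row_lev, col_lev

rng = np.random.default_rng(0)
m, n, k = 200, 100, 10
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n)) + 0.01 * rng.standard_normal((m, n))
row_lev, col_lev = leverage_scores(A, k)

# Bias element (i, j) toward rows/columns with large leverage (assumed
# normalization; s controls the expected number of sampled elements).
s = 4000
P = np.minimum(1.0, s * (row_lev[:, None] + col_lev[None, :])
               / (n * row_lev.sum() + m * col_lev.sum()))
observed = rng.random((m, n)) < P            # sampled support for the low-rank fit
```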


Publication date: 2015